Introductory example
Info: this guide gives you an overview of the typical optimization workflow with mlrMBO. For a much more detailed introduction, see the next chapter.
Here we provide a quickstart example to familiarize yourself with mlrMBO. We aim to maximize a one-dimensional cosine mixture function using model-based optimization. Instead of writing this function by hand, we use the smoof package, which offers many common single-objective optimization functions.
library(smoof)
library(mlr)
library(mlrMBO)
library(ParamHelpers)
obj.fun = makeCosineMixtureFunction(1)
plot(obj.fun)
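For comparison, a similar objective could be defined by hand with smoof::makeSingleObjectiveFunction. This is only a sketch: the exact formula and sign convention of smoof's cosine mixture parametrization are an assumption here, so prefer makeCosineMixtureFunction in practice.

```r
library(smoof)
library(ParamHelpers)

# Hand-written 1D objective in the spirit of the cosine mixture function
# (assumed formula: -0.1 * sum(cos(5 * pi * x)) - sum(x^2), maximized).
obj.fun.manual = makeSingleObjectiveFunction(
  name = "Cosine Mixture (1D, by hand)",
  fn = function(x) -0.1 * sum(cos(5 * pi * x)) - sum(x^2),
  par.set = makeNumericParamSet("x", len = 1L, lower = -1, upper = 1),
  minimize = FALSE
)

obj.fun.manual(0)  # evaluates to -0.1
```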
We decide to use Kriging as our surrogate model and to perform 10 sequential optimization steps. Furthermore, we use expected improvement (EI) as the infill criterion, i.e., the criterion that determines which point(s) of the objective function should be evaluated in each iteration. Keep in mind that using EI as the infill criterion requires a learner that supports standard-error estimation.
As a last step we have to generate an initial design, i.e., the points at which the objective is evaluated before the model-based iterations start. We use ParamHelpers::generateDesign to generate 10 points in a Latin hypercube design.
learner = makeLearner("regr.km", predict.type = "se", covtype = "matern3_2",
  control = list(trace = FALSE))
control = makeMBOControl()
control = setMBOControlTermination(control, iters = 10)
control = setMBOControlInfill(control, crit = "ei")
design = generateDesign(n = 10, par.set = getParamSet(obj.fun))
Finally we start the optimization process and print the result object.
result = mbo(obj.fun, design = design, learner = learner, control = control,
  show.info = TRUE)
#> Computing y column(s) for design. Not provided.
#> [mbo] 0: x=0.815 : y = -0.762 : 0.0 secs : initdesign
#> [mbo] 0: x=0.534 : y = -0.234 : 0.0 secs : initdesign
#> [mbo] 0: x=0.0531 : y = -0.07 : 0.0 secs : initdesign
#> [mbo] 0: x=-0.323 : y = -0.14 : 0.0 secs : initdesign
#> [mbo] 0: x=0.397 : y = -0.257 : 0.0 secs : initdesign
#> [mbo] 0: x=0.608 : y = -0.27 : 0.0 secs : initdesign
#> [mbo] 0: x=-0.477 : y = -0.263 : 0.0 secs : initdesign
#> [mbo] 0: x=-0.76 : y = -0.658 : 0.0 secs : initdesign
#> [mbo] 0: x=-0.15 : y = 0.0484 : 0.0 secs : initdesign
#> [mbo] 0: x=-0.853 : y = -0.795 : 0.0 secs : initdesign
#> [mbo] 1: x=-0.0945 : y = -0.0175 : 0.0 secs : infill_ei
#> [mbo] 2: x=-0.184 : y = 0.063 : 0.0 secs : infill_ei
#> [mbo] 3: x=-0.175 : y = 0.0617 : 0.0 secs : infill_ei
#> [mbo] 4: x=-0.201 : y = 0.0596 : 0.0 secs : infill_ei
#> [mbo] 5: x=-0.19 : y = 0.0627 : 0.0 secs : infill_ei
#> [mbo] 6: x=-0.186 : y = 0.063 : 0.0 secs : infill_ei
#> [mbo] 7: x=-0.181 : y = 0.0628 : 0.0 secs : infill_ei
#> [mbo] 8: x=-0.185 : y = 0.063 : 0.0 secs : infill_ei
#> [mbo] 9: x=0.185 : y = 0.063 : 0.0 secs : infill_ei
#> [mbo] 10: x=0.164 : y = 0.0573 : 0.0 secs : infill_ei
print(result)
#> Recommended parameters:
#> x=-0.185
#> Objective: y = 0.063
#>
#> Optimization path
#> 10 + 10 entries in total, displaying last 10 (or less):
#> x y dob eol error.message exec.time ei
#> 11 -0.09451403 -0.01753958 1 NA <NA> 0 -1.409904e-02
#> 12 -0.18371763 0.06299489 2 NA <NA> 0 -1.027367e-02
#> 13 -0.17468457 0.06168251 3 NA <NA> 0 -1.182718e-03
#> 14 -0.20087432 0.05964008 4 NA <NA> 0 -7.438807e-04
#> 15 -0.18951993 0.06273026 5 NA <NA> 0 -3.341689e-04
#> 16 -0.18598178 0.06299621 6 NA <NA> 0 -8.658221e-05
#> 17 -0.18101246 0.06281954 7 NA <NA> 0 -4.220135e-05
#> 18 -0.18486936 0.06301220 8 NA <NA> 0 -2.866073e-05
#> 19 0.18491046 0.06301218 9 NA <NA> 0 -1.066795e-05
#> 20 0.16351890 0.05728703 10 NA <NA> 0 -3.113826e-03
#> error.model train.time prop.type propose.time
#> 11 <NA> 0.073 infill_ei 0.433
#> 12 <NA> 0.028 infill_ei 0.448
#> 13 <NA> 0.029 infill_ei 0.409
#> 14 <NA> 0.059 infill_ei 0.398
#> 15 <NA> 0.026 infill_ei 0.421
#> 16 <NA> 0.030 infill_ei 0.424
#> 17 <NA> 0.053 infill_ei 0.455
#> 18 <NA> 0.038 infill_ei 0.527
#> 19 <NA> 0.038 infill_ei 0.450
#> 20 <NA> 0.046 infill_ei 0.446
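Beyond printing, the result object can be inspected programmatically. A sketch, assuming the result object from the run above; the accessors follow the documented structure of mlrMBO result objects:

```r
# Recommended parameter setting and its objective value.
result$x
result$y

# The full optimization path (initial design + sequential points)
# can be converted to a data.frame for further analysis or plotting.
op = as.data.frame(result$opt.path)
head(op[, c("x", "y", "dob")])  # dob = "date of birth", i.e. the iteration
```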
Example run
There is also the function exampleRun, which is useful for understanding how mbo works and for visualizing the results.
ex = exampleRun(obj.fun, learner = learner, control = control, show.info = FALSE)
print(ex)
#> MBOExampleRun
#> Number of parameters : 1
#> Parameter names : x
#> Parameter types : numericvector
#> Global Opt (known) : -1.0000e-01
#> Gap for best point : 1.6301e-01
#> True points per dim. : 50
#> Objectives : 1
#> Points proposed per iter : 1
#>
#> Infill criterion : ei
#> Infill optimizer : focussearch
#> Infill optimizer restarts : 1
#> Final point by : best.true.y
#> Learner : regr.km
#> Learner settings:
#> jitter=FALSE,covtype=matern3_2,control=<list>
#> Recommended parameters:
#> x=0.186
#> Objective: y = 6.301e-02
plotExampleRun(ex, iters = c(1L, 3L, 10L))
Alternatively, for a two-dimensional function:
obj.fun2 = makeCosineMixtureFunction(2L)
plot(obj.fun2)
ex2 = exampleRun(obj.fun2, learner = learner, control = control, show.info = FALSE)
print(ex2)
#> MBOExampleRun
#> Number of parameters : 2
#> Parameter names : x1,x2
#> Parameter types : numericvector
#> Global Opt (known) : -2.0000e-01
#> Gap for best point : 3.2443e-01
#> True points per dim. : 50
#> Objectives : 1
#> Points proposed per iter : 1
#>
#> Infill criterion : ei
#> Infill optimizer : focussearch
#> Infill optimizer restarts : 1
#> Final point by : best.true.y
#> Learner : regr.km
#> Learner settings:
#> jitter=FALSE,covtype=matern3_2,control=<list>
#> Recommended parameters:
#> x=-0.196,-0.186
#> Objective: y = 1.244e-01
plotExampleRun(ex2, iters = c(1L, 3L, 10L))
#> Loading required package: gridExtra